Recall from the last note that the covariance matrix is defined as
$$\Sigma = \operatorname{Cov}(X) = E\big[(X - \mu)(X - \mu)^T\big], \qquad \mu = E[X].$$
For all fixed $a \in \mathbb{R}^n$,
$$a^T \Sigma a = \operatorname{Var}(a^T X) \ge 0,$$
here $X = (X_1, \dots, X_n)^T$ is a random vector in $\mathbb{R}^n$.
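A quick numerical check of this identity, given as a minimal sketch assuming NumPy (the data, the vector $a$, and the variable names are illustrative):

```python
# Sketch: empirically check that a^T Sigma a = Var(a^T X) >= 0 for a sample covariance matrix.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(10_000, 3))          # 10,000 samples of a 3-dim random vector
Sigma = np.cov(X, rowvar=False)           # sample covariance matrix (3 x 3)

a = np.array([1.0, -2.0, 0.5])            # an arbitrary fixed vector
quad_form = a @ Sigma @ a                 # a^T Sigma a
var_proj = np.var(X @ a, ddof=1)          # sample variance of a^T X

print(quad_form, var_proj)                # the two agree, and both are >= 0
```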
Positive (semi-)definite matrix
Let $A$ be an $n \times n$ real-valued symmetric matrix, i.e. $A^T = A$.
$A$ is called positive definite, if for any non-zero vector $x \in \mathbb{R}^n$: $x^T A x > 0$.
All eigenvalues are real and positive.
$A$ is non-singular (invertible).
$A$ is called positive semi-definite, if $x^T A x \ge 0$ for all $x \in \mathbb{R}^n$.
All eigenvalues are real and non-negative.
Thus a covariance matrix must be positive semi-definite.
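The eigenvalue characterization can be checked numerically; here is a minimal sketch assuming NumPy (the data and seed are illustrative):

```python
# Sketch: a sample covariance matrix has real, non-negative eigenvalues (positive semi-definite).
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 4))
Sigma = np.cov(X, rowvar=False)

eigvals = np.linalg.eigvalsh(Sigma)       # eigvalsh: eigenvalues of a symmetric matrix (all real)
print(eigvals)                            # all >= 0 (up to floating-point error)
print(np.all(eigvals >= -1e-10))          # True: Sigma is positive semi-definite
```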
Theorem
$A$ is positive semi-definite if and only if $A = BB^T$, where $B$ is some real square matrix.
$A$ is positive definite if and only if $A = BB^T$, where $B$ is some real non-singular square matrix.
Proof (Eigenvector decomposition)
When $A$ is positive (semi-)definite, the eigenvectors of $A$ can be chosen to form an orthonormal basis of $\mathbb{R}^n$. Then $A = Q \Lambda Q^T$, where $\Lambda = \operatorname{diag}(\lambda_1, \dots, \lambda_n)$, and the $i$th column of $Q$ is the eigenvector $q_i$, $i = 1, \dots, n$. Since $A$ is positive semi-definite, $\lambda_i \ge 0$, so we can define $\Lambda^{1/2} = \operatorname{diag}(\sqrt{\lambda_1}, \dots, \sqrt{\lambda_n})$, so
$$A = \big(Q \Lambda^{1/2} Q^T\big)\big(Q \Lambda^{1/2} Q^T\big),$$
and let $A^{1/2} = Q \Lambda^{1/2} Q^T$. ($A^{1/2}$ is called the square-root matrix of $A$.) In particular, $A = A^{1/2}\big(A^{1/2}\big)^T$, so $B = A^{1/2}$ works in the theorem.
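The construction in the proof is easy to carry out numerically; a short sketch assuming NumPy (the matrix is illustrative):

```python
# Sketch: build A^{1/2} = Q Lambda^{1/2} Q^T from the eigendecomposition and verify A^{1/2} A^{1/2} = A.
import numpy as np

rng = np.random.default_rng(2)
B = rng.normal(size=(4, 4))
A = B @ B.T                                    # positive semi-definite by the theorem above

lam, Q = np.linalg.eigh(A)                     # A = Q diag(lam) Q^T, lam >= 0
lam = np.clip(lam, 0.0, None)                  # clip tiny negative round-off
A_half = Q @ np.diag(np.sqrt(lam)) @ Q.T       # square-root matrix A^{1/2}

print(np.allclose(A_half @ A_half, A))         # True
```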
Let $\Sigma$ be a positive semi-definite matrix, and let $\Sigma^{1/2}$ be its square-root matrix. Then $\operatorname{Cov}\big(\Sigma^{1/2} Z\big) = \Sigma^{1/2} \operatorname{Cov}(Z) \big(\Sigma^{1/2}\big)^T = \Sigma$, where $Z = (Z_1, \dots, Z_n)^T$ and $Z_1, \dots, Z_n$ are uncorrelated with mean $0$ and variance $1$.
More generally, for any constant matrix $B$ and constant vector $b$,
$$\operatorname{Cov}(BX + b) = B \operatorname{Cov}(X) B^T.$$
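This is how one constructs a random vector with a prescribed covariance matrix; a Monte Carlo sketch assuming NumPy (the target $\Sigma$ and sample size are illustrative):

```python
# Sketch: Cov(Sigma^{1/2} Z) = Sigma when Z has i.i.d. components with mean 0 and variance 1.
import numpy as np

rng = np.random.default_rng(3)
Sigma = np.array([[2.0, 0.5, 0.0],
                  [0.5, 1.0, 0.3],
                  [0.0, 0.3, 1.5]])            # a positive definite target covariance

lam, Q = np.linalg.eigh(Sigma)
Sigma_half = Q @ np.diag(np.sqrt(lam)) @ Q.T   # square-root matrix of Sigma

Z = rng.normal(size=(200_000, 3))              # rows: i.i.d. standard normal vectors
X = Z @ Sigma_half.T                           # each row is Sigma^{1/2} z

print(np.cov(X, rowvar=False).round(2))        # approximately Sigma
```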
$\Sigma$ is a covariance matrix if and only if $\Sigma$ is positive semi-definite.
$\Sigma$ is a non-singular covariance matrix if and only if $\Sigma$ is positive definite.
When $\Sigma$ is non-singular, $\Sigma^{1/2}$ is invertible. By the proof we have $\Sigma = \Sigma^{1/2} \Sigma^{1/2}$, so
$$\Sigma^{-1} = \Sigma^{-1/2} \Sigma^{-1/2},$$
where $\Sigma^{-1/2} = \big(\Sigma^{1/2}\big)^{-1} = Q \Lambda^{-1/2} Q^T$.
So if $X$ has mean $\mu$ and covariance matrix $\Sigma$, where $\Sigma$ is positive definite, then we can do standardization:
$$Z = \Sigma^{-1/2}(X - \mu), \qquad E[Z] = 0, \quad \operatorname{Cov}(Z) = I_n.$$
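A numerical sketch of the standardization step, assuming NumPy (the mean, covariance, and sample size are illustrative):

```python
# Sketch: Z = Sigma^{-1/2} (X - mu) has mean ~0 and covariance ~identity.
import numpy as np

rng = np.random.default_rng(4)
mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.8],
                  [0.8, 1.0]])                           # positive definite

X = rng.multivariate_normal(mu, Sigma, size=100_000)

lam, Q = np.linalg.eigh(Sigma)
Sigma_inv_half = Q @ np.diag(1.0 / np.sqrt(lam)) @ Q.T   # Sigma^{-1/2}

Z = (X - mu) @ Sigma_inv_half.T
print(Z.mean(axis=0).round(2))                           # ~ [0, 0]
print(np.cov(Z, rowvar=False).round(2))                  # ~ identity matrix
```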
MGF
1-dim: $X \sim N(\mu, \sigma^2)$. Then $M_X(t) = E\big[e^{tX}\big] = \exp\!\big(\mu t + \tfrac{1}{2}\sigma^2 t^2\big)$.
$n$-dim: $X \sim N_n(\mu, \Sigma)$. For any fixed $t \in \mathbb{R}^n$, $t^T X$ is a univariate normal RV, with mean $t^T \mu$ and variance $t^T \Sigma t$, so
$$M_X(t) = E\big[e^{t^T X}\big] = \exp\!\big(t^T \mu + \tfrac{1}{2} t^T \Sigma t\big).$$
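The 1-dim formula can be checked by simulation; a small sketch assuming NumPy (the parameter values are illustrative):

```python
# Sketch: Monte Carlo estimate of the MGF E[exp(tX)] for X ~ N(mu, sigma^2),
# compared with the closed form exp(mu*t + sigma^2*t^2/2).
import numpy as np

rng = np.random.default_rng(5)
mu, sigma, t = 0.5, 1.2, 0.7
X = rng.normal(mu, sigma, size=1_000_000)

mgf_mc = np.mean(np.exp(t * X))
mgf_formula = np.exp(mu * t + 0.5 * sigma**2 * t**2)
print(mgf_mc, mgf_formula)                 # the two values are close
```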
Multivariate normal & $\chi^2$ distribution
Idempotent
A square matrix $A$ is said to be idempotent if $A^2 = A$.
Fact
$A$ is an $n \times n$ symmetric and idempotent matrix of rank $r$ iff $A = \sum_{i=1}^{r} q_i q_i^T$, where $q_1, \dots, q_r$ are orthonormal vectors in $\mathbb{R}^n$.
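A minimal numerical illustration of this fact, assuming NumPy (the dimensions and seed are illustrative):

```python
# Sketch: A = sum_i q_i q_i^T built from orthonormal vectors is symmetric, idempotent, and has rank r.
import numpy as np

rng = np.random.default_rng(6)
n, r = 5, 2
M = rng.normal(size=(n, r))
Q, _ = np.linalg.qr(M)                     # columns of Q: r orthonormal vectors in R^n
A = Q @ Q.T                                # A = sum of q_i q_i^T

print(np.allclose(A, A.T))                 # symmetric
print(np.allclose(A @ A, A))               # idempotent: A^2 = A
print(np.linalg.matrix_rank(A))            # rank r = 2
```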
Theorem
Suppose $Z \sim N_n(0, I_n)$ and $A$ is an $n \times n$ symmetric matrix. If $A$ is idempotent with rank $r$, then $Z^T A Z \sim \chi^2_r$.
In fact, the converse is also true.
Proof
By the fact above, $A = \sum_{i=1}^{r} q_i q_i^T$, where $q_1, \dots, q_r$ are orthonormal. So
$$Z^T A Z = \sum_{i=1}^{r} Z^T q_i q_i^T Z = \sum_{i=1}^{r} \big(q_i^T Z\big)^2.$$
Therefore $q_1^T Z, \dots, q_r^T Z$ are i.i.d. $N(0, 1)$: they are jointly normal, each with mean $0$ and variance $q_i^T q_i = 1$, and $\operatorname{Cov}(q_i^T Z, q_j^T Z) = q_i^T q_j = 0$ for $i \ne j$. The theorem now follows from the fact that $W_1^2 + \cdots + W_r^2 \sim \chi^2_r$ if $W_1, \dots, W_r \overset{\text{i.i.d.}}{\sim} N(0, 1)$.
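The theorem can also be checked by simulation; a short sketch assuming NumPy (the dimensions and seed are illustrative):

```python
# Sketch: simulate Z^T A Z for Z ~ N_n(0, I) and a symmetric idempotent A of rank r,
# and compare its mean/variance with those of chi^2_r (mean r, variance 2r).
import numpy as np

rng = np.random.default_rng(7)
n, r = 6, 3
Q, _ = np.linalg.qr(rng.normal(size=(n, r)))
A = Q @ Q.T                                # symmetric, idempotent, rank r

Z = rng.normal(size=(100_000, n))
quad = np.einsum('ij,jk,ik->i', Z, A, Z)   # Z^T A Z for each sample

print(quad.mean(), r)                      # ~ r
print(quad.var(), 2 * r)                   # ~ 2r
```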
The above result has many applications in Statistics.
E.g., $X_1, \dots, X_n \overset{\text{i.i.d.}}{\sim} N(\mu, \sigma^2)$. Let $\bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i$ and $S^2 = \frac{1}{n-1}\sum_{i=1}^{n} (X_i - \bar{X})^2$. Then, the above result can be used to show
$$\frac{(n-1) S^2}{\sigma^2} \sim \chi^2_{n-1}.$$
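A simulation sketch of this application, assuming NumPy (the sample size, $\mu$, and $\sigma$ are illustrative):

```python
# Sketch: simulate (n-1) S^2 / sigma^2 for i.i.d. N(mu, sigma^2) samples and compare
# its mean/variance with those of chi^2_{n-1}.
import numpy as np

rng = np.random.default_rng(8)
n, mu, sigma = 10, 2.0, 1.5
X = rng.normal(mu, sigma, size=(100_000, n))   # 100,000 samples, each of size n

S2 = X.var(axis=1, ddof=1)                 # sample variance S^2 of each row
stat = (n - 1) * S2 / sigma**2

print(stat.mean(), n - 1)                  # ~ n - 1
print(stat.var(), 2 * (n - 1))             # ~ 2(n - 1)
```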